Learning Parities with Structured Noise
Authors
Abstract
In the learning parities with noise (LPN) problem —well-studied in learning theory and cryptography— we have access to an oracle that, each time we press a button, returns a random vector a ∈ GF(2)^n together with a bit b ∈ GF(2) computed as b = a · u + η, where u ∈ GF(2)^n is a secret vector and η ∈ GF(2) is a noise bit that is 1 with some probability p; say p = 1/3. The goal is to recover u. This task is conjectured to be intractable. Here we introduce a slight (?) variation of the model: upon pressing a button, we receive (say) 10 random vectors a1, a2, . . . , a10 ∈ GF(2)^n and corresponding bits b1, b2, . . . , b10, of which at most 3 are noisy. The oracle may decide arbitrarily which of the 10 bits to make noisy. We exhibit a polynomial-time algorithm that recovers the secret vector u given such an oracle. We discuss generalizations of our result, including learning with more general noise patterns. We can also learn low-depth decision trees in the above structured noise model. Finally, we consider the learning with errors (LWE) problem over GF(q) and give (a) a 2^{Õ(√n)}-time algorithm in our structured noise setting and (b) a slightly subexponential algorithm when the Gaussian noise is small.
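The oracle described above can be simulated in a few lines. The sketch below is purely illustrative: the dimension `n`, batch size, noise budget, and the oracle's rule for choosing which bits to corrupt (here, a random subset of size at most 3) are all assumptions for demonstration, not part of the paper's construction.

```python
import random

def make_oracle(n=20, batch=10, max_noisy=3, seed=0):
    """Simulate the structured-noise LPN oracle from the abstract.

    Each button press returns `batch` random vectors a_i in GF(2)^n and
    bits b_i = <a_i, u> + eta_i (mod 2), where at most `max_noisy` of
    the eta_i equal 1. Parameter names are illustrative assumptions.
    """
    rng = random.Random(seed)
    u = [rng.randrange(2) for _ in range(n)]  # hidden secret vector

    def press():
        samples = []
        # the oracle may choose arbitrarily which bits to corrupt;
        # this sketch picks a random subset of size <= max_noisy
        noisy = set(rng.sample(range(batch), rng.randrange(max_noisy + 1)))
        for i in range(batch):
            a = [rng.randrange(2) for _ in range(n)]
            b = sum(x & y for x, y in zip(a, u)) % 2  # noiseless parity
            if i in noisy:
                b ^= 1  # flip the answer bit: structured noise
            samples.append((a, b))
        return samples

    return u, press

u, press = make_oracle()
batch = press()
# count how many of the 10 returned bits disagree with <a_i, u>
errors = sum(b != sum(x & y for x, y in zip(a, u)) % 2 for a, b in batch)
```

Every batch is guaranteed to contain at most 3 corrupted bits, which is exactly the structure the paper's algorithm exploits.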
Similar references
New Algorithms for Learning in Presence of Errors
We give new algorithms for a variety of randomly-generated instances of computational problems using a linearization technique that reduces the problem to solving a system of linear equations. These algorithms are derived in the context of learning with structured noise, a notion introduced in this paper. This notion is best illustrated with the learning parities with noise (LPN) problem —well-studied in l...
On Noise-Tolerant Learning of Sparse Parities and Related Problems
We consider the problem of learning sparse parities in the presence of noise. For learning parities on r out of n variables, we give an algorithm that runs in time poly(log(1/δ), 1/(1−2η)) · n^{(1+(2η)²+o(1))r/2} and uses only r · log(n/δ) · ω(1)/(1−2η)² samples in the random noise setting under the uniform distribution, where η is the noise rate and δ is the confidence parameter. From previously known result...
On learning k-parities with and without noise
We first consider the problem of learning k-parities in the on-line mistake-bound model: given a hidden vector x ∈ {0, 1}^n with |x| = k and a sequence of "questions" a1, a2, · · · ∈ {0, 1}^n, where the algorithm must reply to each question with ⟨ai, x⟩ (mod 2), what is the best tradeoff between the number of mistakes made by the algorithm and its time complexity? We improve the previous best resu...
Finding Correlations in Subquadratic Time, with Applications to Learning Parities and Juntas with Noise (Preliminary Version)
Given a set of n random d-dimensional boolean vectors with the promise that two of them are ρ-correlated with each other, how quickly can one find the two correlated vectors? We present a surprising and simple algorithm which, for any constant ε > 0, runs in (expected) time d · n^{3ω/4+ε} · poly(1/ρ) < d · n^{1.8} · poly(1/ρ), where ω < 2.4 is the exponent of matrix multiplication. This is the first subquadrati...
Efficiency and Computational Limitations of Learning Algorithms
This thesis presents new positive and negative results concerning the learnability of several well-studied function classes in the Probably Approximately Correct (PAC) model of learning. Learning Disjunctive Normal Form (DNF) expressions in the PAC model is widely considered to be the main open problem in Computational Learning Theory. We prove that PAC learning of DNF expressions by an algorit...
Journal:
- Electronic Colloquium on Computational Complexity (ECCC)
Volume 17, Issue -
Pages -
Publication date: 2010